In statistics, the explained sum of squares (ESS), alternatively known as the model sum of squares or sum of squares due to regression ("SSR" – not to be confused with the residual sum of squares, RSS), is a quantity used in describing how well a model, often a regression model, represents the data being modelled. In particular, the explained sum of squares measures how much variation there is in the modelled values; this is compared to the total sum of squares, which measures how much variation there is in the observed data, and to the residual sum of squares, which measures the variation in the modelling errors.

==Definition==

The explained sum of squares (ESS) is the sum of the squares of the deviations of the predicted values from the mean value of the response variable, in a standard regression model, for example

:<math>y_i = a + b_1 x_{1i} + b_2 x_{2i} + \cdots + \varepsilon_i ,</math>

where ''y''<sub>''i''</sub> is the ''i''th observation of the response variable, ''x''<sub>''ji''</sub> is the ''i''th observation of the ''j''th explanatory variable, ''a'' and ''b''<sub>''j''</sub> are coefficients, ''i'' indexes the observations from 1 to ''n'', and ''ε''<sub>''i''</sub> is the ''i''th value of the error term. In general, the greater the ESS, the better the estimated model performs.

If <math>\hat{a}</math> and <math>\hat{b}_j</math> are the estimated coefficients, then

:<math>\hat{y}_i = \hat{a} + \hat{b}_1 x_{1i} + \hat{b}_2 x_{2i} + \cdots</math>

is the ''i''th predicted value of the response variable. The ESS is then the sum of the squares of the differences between the predicted values and the mean value of the response variable:

:<math>\text{ESS} = \sum_{i=1}^n \left(\hat{y}_i - \bar{y}\right)^2 .</math>

In general: total sum of squares = explained sum of squares + residual sum of squares.
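As a numerical illustration of the definition and of the decomposition above, the three sums of squares can be computed for an ordinary least squares fit. This is a minimal sketch in NumPy; the data are made up, and the variable names (`ess`, `rss`, `tss`) are illustrative:

```python
import numpy as np

# Synthetic data (hypothetical example): y = 2 + 3x + noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.5, size=x.size)

# Ordinary least squares fit with an intercept: design matrix [1, x]
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta                          # predicted values

ess = np.sum((y_hat - y.mean()) ** 2)     # explained sum of squares
rss = np.sum((y - y_hat) ** 2)            # residual sum of squares
tss = np.sum((y - y.mean()) ** 2)         # total sum of squares

# For OLS with an intercept, TSS = ESS + RSS (up to floating-point error)
print(ess, rss, tss)
```

For a least-squares fit that includes an intercept, `tss` equals `ess + rss` up to floating-point rounding, which is the decomposition stated above.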